perm filename NOTES.REA[LSP,JRA] blob
sn#092468 filedate 1974-03-16 generic text, type C, neo UTF8
Preface: read the table of contents and the index first to get the
"flavor". Then read the following.
This is to introduce the text to the cognoscenti, rather than the unsuspecting
student. There is only one reason to be mysterious, but I found it
to be an important one when introducing the material to the
uninitiated. In particular the mystery involves the introduction of %3eval%*.
When faced with evaluation of LISP --- composition, conditional expressions,
lambda expressions, call-by-value, ... --- layered with M-to-S-expr translation,
and topped with %3eval%*, it tends to be much too much unless a great amount
of time is taken to reinforce the ideas. The processes are quite straightforward
and should not be so difficult to understand. So my approach is the following:
There are many intuitive algorithms -- differentiation, for one -- which
involve all of the details of the mapping process used in %3eval%* without
requiring understanding of a %2new%* algorithm at the same time.
Use some ... there is nothing sacred
about differentiation. As homework I give the section named %2 The Great Mothers%*.
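The kind of algorithm meant can be sketched quickly. Below is an illustrative rendering (in modern Python; the operator set and the two-operand restriction on TIMES are my choices, not the notes') of differentiation over polynomials in Cambridge Polish. The dispatch-and-recur shape is the point: it is the same shape %3eval%* will have.

```python
def diff(e, x):
    """d/dx of an expression in Cambridge Polish, e.g. ['PLUS', 'x', 3]."""
    if isinstance(e, (int, float)):      # a constant differentiates to 0
        return 0
    if isinstance(e, str):               # a variable: 1 if it is x, else 0
        return 1 if e == x else 0
    op, *args = e                        # dispatch on the leading operator
    if op == 'PLUS':
        return ['PLUS'] + [diff(a, x) for a in args]
    if op == 'TIMES' and len(args) == 2:
        u, v = args                      # product rule: u'v + uv'
        return ['PLUS', ['TIMES', diff(u, x), v],
                        ['TIMES', u, diff(v, x)]]
    raise ValueError(f'unknown operator: {op}')
```

No simplification is attempted; the tedium of the raw answers is part of the homework.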
This is done, cold turkey,
much before %3eval%* is even mentioned---still doing examples and applications.
%3tgmoaf%* and friends, which are clearly subsets of %3eval%*, are worked
simply as rather tedious problems. Most students who have not seen %3eval%* before
do not realize what %3tgmoaf%* has done to them. Then when %3eval%* is described
you have two things going for you:
1) the mappings: Mexpr => Sexpr; call-by-value + conds => %3eval%*;
are analogous to intuitive examples like:
polynomials => Cambridge Polish; intuitive differentiation => %3diff%*
2) they have already been doing evaluation of Sexpr translates without knowing it
and without "getting upset".
IT WORKS, and works well !!
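For flavor, here is a guess (in Python; the primitive set and the use of QUOTE are my assumptions) at what a %3tgmoaf%*-like subset evaluator amounts to: constant arguments, primitive functions, and composition, and nothing else.

```python
# A sketch of a tgmoaf-flavored evaluator: no variables, no lambda --
# just quoted constants, a few primitives, and composition.
PRIMS = {
    'CAR':  lambda l: l[0],
    'CDR':  lambda l: l[1:],
    'CONS': lambda a, d: [a] + d,
}

def tgmoaf(e):
    if isinstance(e, list) and e and e[0] == 'QUOTE':
        return e[1]                        # quoted data is a constant
    op, *args = e
    vals = [tgmoaf(a) for a in args]       # composition: arguments first
    return PRIMS[op](*vals)
```

Worked by hand on a few examples, this is exactly the "rather tedious problem" described above.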
λ-calculus is easy to introduce... it's "what do you need to store
a function definition in an %3assoc%*-like symbol table?"
After the a-list version of %3eval%* is digested,
LISP is extended to include %3label%*, functional args, and %3prog%*s.
Special forms are dealt with, and finally (retch) there is a section
on the machine.
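The symbol-table question answers itself in a few lines. A sketch (Python; the representation and the PLUS primitive are my assumptions): a %3lambda%* definition is stored as an ordinary binding on the a-list, and application just pushes parameter/value pairs onto the front.

```python
# The a-list idea: assoc-style lookup, newest bindings at the front,
# function definitions stored like any other binding.
def assoc(name, alist):
    for k, v in alist:
        if k == name:
            return v
    raise NameError(name)

def ev(e, alist):
    if isinstance(e, int):
        return e                              # constants
    if isinstance(e, str):
        return assoc(e, alist)                # variables
    op, *args = e
    vals = [ev(a, alist) for a in args]       # call-by-value
    if op == 'PLUS':
        return sum(vals)
    _, params, body = assoc(op, alist)        # ('LAMBDA', params, body)
    return ev(body, list(zip(params, vals)) + alist)
```

So "what do you need to store a function definition?" has the pleasant answer: nothing new at all.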
The next section
goes on to implementation. Better symbol tables in the sense of
permanence (of storage of definitions), efficiency, generalized calling (fexprs and
macros), and multiple "values" (property-lists).
Implementation of the primitives, hashing, and garbage collection is begun.
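The multiple-"values" point is simply that an atom's table entry becomes a property list: several indicator/value pairs instead of one value. A sketch (Python dicts standing in for the list structure; the indicator names are my guesses):

```python
# Property lists: one name can simultaneously carry a value, a
# function definition, a print name, and so on, keyed by indicator.
proplists = {}

def putprop(atom, indicator, value):
    proplists.setdefault(atom, {})[indicator] = value

def getprop(atom, indicator):
    return proplists.get(atom, {}).get(indicator)   # None if absent

putprop('F', 'VALUE', 3)
putprop('F', 'EXPR', ('LAMBDA', ['x'], 'x'))
```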
Two graphical languages are introduced here. AMBIT/G is good for describing
the structure of the primitives and garbage collection, but does not
describe the dynamics of activation and environments well. The Contour model is
used for that in the notes but I think I can do better now. The sections
on graphics are very incomplete since I don't like to draw pictures more than once.
Graphical languages are very good; they carry a lot of information with them.
Macros and special forms are described as user-definable calling sequences
(Conniver and Muddle calling was given as an extension of %3eval%* earlier).
Shallow binding is introduced and a subset of the PDP-10 is given in preparation
for the next section on compilers.
Finally the PDP-10 %3eval%* is given.
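Shallow binding, in a sketch (Python; on the real machine this lives in the symbol's value cell): each symbol owns one value cell, binding saves the old contents on a stack, so lookup is a single fetch and unbinding a single restore.

```python
# Shallow binding: one value cell per symbol, plus a save stack.
value_cell = {}
save_stack = []

def bind(sym, val):
    save_stack.append((sym, value_cell.get(sym)))   # save old contents
    value_cell[sym] = val

def unbind():
    sym, old = save_stack.pop()                     # restore on exit
    value_cell[sym] = old
```

The trade against the a-list is the usual one: cheap lookup, but entry and exit from a binding context must do the save/restore work.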
The final section on compilers uses the %2great mothers%* again. Three
compilers are given. The first handles constant args, functions, and composition
(corresponding to %3tgmoaf%*); then conditionals are added, corresponding to %3tgmoafr%*.
One-pass assemblers are discussed here since conditionals usually invite
forward references.
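The first compiler really is tiny. A sketch of its shape (Python emitting a made-up stack machine rather than PDP-10 code; instruction names are mine): compile the arguments left to right, then emit the call. No branches appear yet, which is why conditionals and their forward references wait for the second compiler.

```python
# Compiler for constants, functions, and composition only.
def compile_expr(e, code):
    if isinstance(e, int):
        code.append(('PUSH', e))           # constant argument
    else:
        op, *args = e
        for a in args:
            compile_expr(a, code)          # composition: args first
        code.append(('CALL', op, len(args)))
    return code

def run(code):
    """A toy target machine, just to check the compiled code."""
    prims = {'PLUS': lambda a, b: a + b, 'TIMES': lambda a, b: a * b}
    stack = []
    for instr in code:
        if instr[0] == 'PUSH':
            stack.append(instr[1])
        else:
            _, op, n = instr
            args = stack[-n:]
            del stack[-n:]
            stack.append(prims[op](*args))
    return stack.pop()
```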
The last compiler handles local variables (it's essentially from the
London paper ... without internal lambdas; they're given as a problem).
Global variables, deep and shallow binding are discussed, followed by
tracing ... the CALL-PUSHJ mechanism. The section on functional args
again is lots of pictures to be (sigh) drawn.
Bobrow-Wegbreit and Weizenbaum here.
Compilation
of macros and special forms finishes things. Numbers are in the wrong section.
Next section has other data structures... arrays in LISP... then
string processors and linear LISP. This is used to introduce
compacting g.c. ... needs more. Finally %3rplaca%* and %3rplacd%*
are given... with their uses and dangers. Applications include the assembler
and the system insides of LISP.
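The rplac danger is shared structure. A sketch (Python two-element lists standing in for cons cells): one destructive update is visible through every pointer to the cell, which is exactly the power the assembler and the system insides need, and exactly the hazard for everyone else.

```python
# Cons cells as two-element lists; rplacd destructively replaces the cdr.
def cons(a, d):
    return [a, d]

def rplacd(cell, d):
    cell[1] = d          # smash the cdr in place
    return cell

shared = cons('B', None)           # the cell ('B' . NIL)
x = cons('A', shared)              # x = (A B)
y = cons('C', shared)              # y = (C B), tail shared with x
rplacd(shared, cons('D', None))    # now BOTH x and y end in ... B D
```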
Sections to be written... implications for other languages is written but not
yet in.
Needs lots of problems.
Section on AI and applications is "meta written".
So the text breaks into four sections now...mechanics of the language; evaluation;
implementation; and compilation. The emphasis is obviously on the structure of the
language rather than applications. I believe that this is the strong point of
LISP. Joe Hacker's algorithms can be written in ANY language, but understanding
programming languages is best done by understanding LISP.